Parallel Block Coordinate Minimization with Application to Group Regularized Regression
Abstract
This paper proposes a method for parallel block coordinate-wise minimization of convex functions. Each iteration involves a first phase in which n independent minimizations are performed over the n variable blocks, followed by a second phase in which the results of the first phase are coordinated to obtain the full variable update. Convergence of the method to the global optimum is proved for functions composed of a smooth part plus a possibly non-smooth but separable term. The method is also proved to have a linear rate of convergence for functions that are smooth and strongly convex. The proposed algorithm can give a computational advantage over the more standard serial block coordinate-wise minimization methods when run on a parallel, multi-worker computing architecture. The method is suitable for regularized regression problems, such as the group Lasso, group Ridge regression, and group Elastic Net. Numerical tests are run on such regression problems to illustrate the performance of the proposed parallel method in comparison with the serial one.
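As an illustration of the two-phase scheme described above, the following Python sketch applies it to a group Lasso objective. It is only a sketch under simplifying assumptions: each group's design matrix is assumed to have orthonormal columns so that the block subproblems have a closed-form (group soft-thresholding) solution, and the coordination phase is taken to be a simple convex combination of the block-wise candidates; the paper's actual coordination rule is not specified in the abstract and may differ. All function and variable names are illustrative.

import numpy as np

def block_soft_threshold(z, lam):
    """Group soft-thresholding: argmin_w 0.5*||w - z||^2 + lam*||w||_2."""
    norm_z = np.linalg.norm(z)
    if norm_z <= lam:
        return np.zeros_like(z)
    return (1.0 - lam / norm_z) * z

def parallel_block_step(X_blocks, y, w_blocks, lam):
    """One iteration of the two-phase scheme for the group Lasso
    0.5*||y - sum_g X_g w_g||^2 + lam * sum_g ||w_g||_2."""
    n = len(X_blocks)
    residual_full = y - sum(Xg @ wg for Xg, wg in zip(X_blocks, w_blocks))

    # Phase 1: minimize over each block with the others held fixed.
    # These n subproblems are independent and could be dispatched to
    # separate workers; they are written sequentially here for clarity.
    # Assuming X_g^T X_g = I, the exact block minimizer is a group
    # soft-thresholding of X_g^T (residual with block g added back).
    candidates = []
    for Xg, wg in zip(X_blocks, w_blocks):
        rg = residual_full + Xg @ wg  # residual ignoring block g
        candidates.append(block_soft_threshold(Xg.T @ rg, lam))

    # Phase 2 (illustrative coordination): move each block 1/n of the way
    # toward its candidate, i.e. average the n one-block-updated iterates.
    return [wg + (cg - wg) / n for wg, cg in zip(w_blocks, candidates)]

A small usage example, with a synthetic problem whose group design matrices are orthonormalized via QR so the closed-form assumption above holds:

rng = np.random.default_rng(0)
X_blocks = [np.linalg.qr(rng.standard_normal((50, 5)))[0] for _ in range(4)]
y = rng.standard_normal(50)
w = [np.zeros(5) for _ in range(4)]
for _ in range(200):
    w = parallel_block_step(X_blocks, y, w, lam=0.1)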
Similar Resources
Feature Clustering for Accelerating Parallel Coordinate Descent
Large-scale ℓ1-regularized loss minimization problems arise in high-dimensional applications such as compressed sensing and high-dimensional supervised learning, including classification and regression problems. High-performance algorithms and implementations are critical to efficiently solving these problems. Building upon previous work on coordinate descent algorithms for ℓ1-regularized probl...
A Parallel Best-Response Algorithm with Exact Line Search for Nonconvex Sparsity-Regularized Rank Minimization
In this paper, we propose a convergent parallel best-response algorithm with exact line search for the nondifferentiable, nonconvex sparsity-regularized rank minimization problem. On the one hand, it exhibits faster convergence than subgradient algorithms and block coordinate descent algorithms. On the other hand, its convergence to a stationary point is guaranteed, while ADMM algorithms o...
A coordinate gradient descent method for ℓ1-regularized convex minimization
In applications such as signal processing and statistics, many problems involve finding sparse solutions to under-determined linear systems of equations. These problems can be formulated as structured nonsmooth optimization problems, i.e., the problem of minimizing ℓ1-regularized linear least squares problems. In this paper, we propose a block coordinate gradient descent method (abbreviated a...
Accelerated Block-coordinate Relaxation for Regularized Optimization
We discuss minimization of a smooth function to which is added a regularization function that induces structure in the solution. A block-coordinate relaxation approach with proximal linearized subproblems yields convergence to stationary points, while identification of the optimal manifold (under a nondegeneracy condition) allows acceleration techniques to be applied on a reduced space. The wor...
Parallel Coordinate Descent for L1-Regularized Loss Minimization
We propose Shotgun, a parallel coordinate descent algorithm for minimizing L1-regularized losses. Though coordinate descent seems inherently sequential, we prove convergence bounds for Shotgun which predict linear speedups, up to a problem-dependent limit. We present a comprehensive empirical study of Shotgun for Lasso and sparse logistic regression. Our theoretical predictions on the potential f...